The subdifferential descent method in a nonsmooth variational problem

Authors

Abstract

The paper is devoted to the classical variational problem with a nonsmooth integrand of the functional to be minimized. The integrand is supposed to be subdifferentiable. Under some natural conditions, subdifferentiability of the functional considered is proved. The problem of finding a subdifferential descent direction is solved, and the subdifferential descent method is applied to solve the original problem. The algorithm developed is demonstrated by examples.
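
For intuition only, a minimal sketch of the subdifferential-descent idea on a discretised model problem is given below; the test integrand |x'(t)| + x(t)^2, the grid, the boundary conditions and the diminishing step rule are assumptions chosen for illustration, not the algorithm of the paper.

```python
import numpy as np

# Minimal sketch (not the paper's algorithm): subgradient descent on a
# discretised nonsmooth variational problem
#     minimize  I(x) = \int_0^1 ( |x'(t)| + x(t)^2 ) dt,  x(0)=0, x(1)=1,
# where the nonsmooth term |x'| forces the use of a subgradient instead of
# a gradient.  Grid, integrand and step rule are illustrative assumptions.

N = 101
t = np.linspace(0.0, 1.0, N)
h = t[1] - t[0]

def objective(x):
    dx = np.diff(x) / h                       # forward differences ~ x'(t)
    return h * np.sum(np.abs(dx) + x[:-1] ** 2)

def subgradient(x):
    """One element of the subdifferential of the discretised functional."""
    dx = np.diff(x) / h
    s = np.sign(dx)                           # a subgradient of |.| (0 at kinks)
    g = np.zeros_like(x)
    g[:-1] -= s                               # |dx_i| depends on nodes i and i+1
    g[1:] += s
    g[:-1] += 2.0 * h * x[:-1]                # smooth part h * x_i^2
    g[0] = g[-1] = 0.0                        # endpoints are fixed
    return g

x = np.linspace(0.0, 1.0, N)                  # initial guess satisfying x(0)=0, x(1)=1
for k in range(1, 5001):
    x = x - (0.1 / k) * subgradient(x)        # step along a subdifferential descent direction
    x[0], x[-1] = 0.0, 1.0                    # restore boundary conditions

print("approximate minimal value:", objective(x))
```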

Similar articles

A Descent Method for Nonsmooth Variational Inequalities via Regularization

In this paper we propose a descent method for solving variational inequality problems where the underlying operator is nonsmooth, locally Lipschitz, and monotone over a closed, convex feasible set. The idea is to combine a descent method for variational inequality problems whose operators are nonsmooth, locally Lipschitz, and strongly monotone with the Tikhonov-Browder regularization technique....
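
As background for why the regularization step helps (a standard fact, not a claim about this paper's construction): adding a small multiple of the identity to a monotone operator yields a strongly monotone one, so a descent method for strongly monotone operators can be applied to each regularized problem.

```latex
% If F is monotone and \varepsilon > 0, then F_\varepsilon(x) := F(x) + \varepsilon x
% is strongly monotone with modulus \varepsilon:
\langle F_\varepsilon(x) - F_\varepsilon(y),\, x - y \rangle
  = \langle F(x) - F(y),\, x - y \rangle + \varepsilon \lVert x - y \rVert^2
  \;\ge\; \varepsilon \lVert x - y \rVert^2 .
```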

A globally convergent descent method for nonsmooth variational inequalities

We propose a descent method via gap functions for solving nonsmooth variational inequalities with a locally Lipschitz operator. Assuming that the operator is monotone (not necessarily strongly monotone) and that the domain is bounded, we show that the method with an Armijo-type line search is globally convergent. Finally, we report some numerical experiments.
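
A hedged sketch of this kind of method is given below, using Fukushima's regularized gap function and an Armijo-type backtracking step; the affine test operator, the box constraint, and all parameters are illustrative assumptions rather than the construction of the cited paper.

```python
import numpy as np

# Illustrative sketch: descent on the regularized gap function for a
# box-constrained variational inequality VI(F, K):
#   find x* in K such that <F(x*), y - x*> >= 0 for all y in K.
# F, the box K = [-1, 1]^2 and all parameters are assumptions for the example.

lo, hi, alpha = -1.0, 1.0, 1.0
A = np.array([[2.0, 1.0], [-1.0, 2.0]])       # symmetric part is positive definite
b = np.array([-1.0, 1.0])

def F(x):
    return A @ x + b                          # monotone affine test operator

def proj(x):
    return np.clip(x, lo, hi)                 # projection onto the box K

def gap(x):
    """Regularized gap function g(x) = max_{y in K} <F(x), x-y> - alpha/2 ||x-y||^2."""
    y = proj(x - F(x) / alpha)
    return F(x) @ (x - y) - 0.5 * alpha * np.dot(x - y, x - y), y

x = np.zeros(2)
for _ in range(200):
    g_val, y = gap(x)
    d = y - x                                 # descent direction here (Jacobian of F is positive definite)
    if np.linalg.norm(d) < 1e-10:
        break
    t = 1.0                                   # Armijo-type backtracking on the gap value
    while gap(x + t * d)[0] > g_val - 1e-4 * t * np.dot(d, d):
        t *= 0.5
        if t < 1e-12:
            break
    x = x + t * d

print("approximate VI solution:", x, " gap value:", gap(x)[0])
```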

Weak Subdifferential in Nonsmooth Analysis and Optimization

Some properties of the weak subdifferential are considered in this paper. Using the definition and properties of the weak subdifferential described in the papers (Azimov and Gasimov, 1999; Kasimbeyli and Mammadov, 2009; Kasimbeyli and Inceoglu, 2010), the author proves some theorems connecting the weak subdifferential with nonsmooth and nonconvex analysis. The author also obtains necessary op...
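
For context, the definition of the weak subdifferential used in that line of work is recalled below in a commonly quoted form; the notation here may differ from the paper's.

```latex
% Weak subdifferential in the sense of Azimov and Gasimov (1999):
% a pair (x^*, c) \in X^* \times \mathbb{R}_+ is a weak subgradient of
% f : X \to \mathbb{R} at \bar{x} if
f(x) - f(\bar{x}) \;\ge\; \langle x^*, x - \bar{x} \rangle - c\,\lVert x - \bar{x} \rVert
\quad \text{for all } x \in X,
% and \partial^{w} f(\bar{x}) denotes the set of all such pairs.
```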

A Weighted Mirror Descent Algorithm for Nonsmooth Convex Optimization Problem

Large-scale nonsmooth convex optimization is a common problem in a range of computational areas, including machine learning and computer vision. Problems in these areas contain special domain structures and characteristics. Special treatment of such problem domains, exploiting their structures, can significantly reduce the computational burden. We present a weighted Mirror Descent method to so...

A Coordinate Gradient Descent Method for Nonsmooth Nonseparable Minimization

This paper presents a coordinate gradient descent approach for minimizing the sum of a smooth function and a nonseparable convex function. We find a search direction by solving a subproblem obtained by taking a second-order approximation of the smooth function and adding a separable convex function. Under a local Lipschitzian error bound assumption, we show that the algorithm possesses global and loca...
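
A compact sketch of the coordinate gradient descent idea is given below; for brevity it uses a separable l1 term, so each coordinate subproblem is solved by soft-thresholding, whereas the cited paper also treats a nonseparable convex term. The quadratic f, the penalty lam, and the curvature estimates h_i are illustrative assumptions.

```python
import numpy as np

# Compact sketch of coordinate gradient descent for
#   minimize F(x) = f(x) + lam * ||x||_1,   f smooth.
# Each coordinate subproblem
#   min_d  grad_i f(x) * d + (h_i / 2) * d^2 + lam * |x_i + d|
# has the closed-form soft-thresholding solution used below.
# f, lam and h_i are illustrative assumptions, not the paper's setting.

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = rng.standard_normal(30)
lam = 0.5

def f(x):
    return 0.5 * np.sum((A @ x - b) ** 2)     # smooth part

def grad_f(x):
    return A.T @ (A @ x - b)

h = np.sum(A * A, axis=0)                     # per-coordinate curvature, diag of A^T A

def soft(z, tau):
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

x = np.zeros(10)
for sweep in range(100):
    for i in range(10):
        g = grad_f(x)[i]
        x[i] = soft(x[i] - g / h[i], lam / h[i])   # exact coordinate update

print("objective value:", f(x) + lam * np.sum(np.abs(x)))
```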


Journal

Journal title: Optimization Letters

Year: 2022

ISSN: 1862-4472, 1862-4480

DOI: https://doi.org/10.1007/s11590-022-01897-3